
    The Cosmological Origin of Primordial Magnetic Fields


    Cadre Modeling: Simultaneously Discovering Subpopulations and Predictive Models

    We consider the problem in regression analysis of identifying subpopulations that exhibit different patterns of response, where each subpopulation requires a different underlying model. Unlike statistical cohorts, these subpopulations are not known a priori; thus, we refer to them as cadres. When the cadres and their associated models are interpretable, modeling leads to insights about the subpopulations and their associations with the regression target. We introduce a discriminative model that simultaneously learns cadre assignment and target-prediction rules. Sparsity-inducing priors are placed on the model parameters, under which independent feature selection is performed for both the cadre-assignment and target-prediction processes. We learn models using adaptive-step-size stochastic gradient descent, and we assess cadre quality with bootstrapped sample analysis. We present simulated results showing that, when the true clustering rule does not depend on the entire set of features, our method significantly outperforms methods that learn subpopulation-discovery and target-prediction rules separately. In a materials-by-design case study, our model provides state-of-the-art prediction of polymer glass transition temperature. Importantly, the method identifies cadres of polymers that respond differently to structural perturbations, thus providing design insight for targeting or avoiding specific transition temperature ranges. It identifies chemically meaningful cadres, each with interpretable models. Further experimental results show that cadre methods generalize competitively with linear and nonlinear regression models and can identify robust subpopulations. Comment: 8 pages, 6 figures
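
    As a rough illustration of the cadre idea, the sketch below (a hypothetical PyTorch reconstruction, not the authors' code) pairs a soft gating network for cadre assignment with one linear regressor per cadre; L1 penalties stand in for the sparsity-inducing priors, and an adaptive-step-size optimizer (Adam) is used for training. All names, sizes, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch of a cadre-style model: a soft gating network assigns each
# sample to one of K cadres, and each cadre has its own linear regressor.
# Illustrative reconstruction only; not the authors' released implementation.
import torch
import torch.nn as nn

class CadreRegressor(nn.Module):
    def __init__(self, n_features: int, n_cadres: int = 3):
        super().__init__()
        self.gate = nn.Linear(n_features, n_cadres)     # cadre-assignment rule
        self.experts = nn.Linear(n_features, n_cadres)  # one linear target model per cadre

    def forward(self, x):
        weights = torch.softmax(self.gate(x), dim=-1)   # soft cadre membership
        preds = self.experts(x)                         # per-cadre predictions
        return (weights * preds).sum(dim=-1)            # membership-weighted prediction

def sparsity_penalty(model, lam=1e-3):
    # L1 penalties applied independently to the gating and prediction weights,
    # standing in for the sparsity-inducing priors described in the abstract.
    return lam * (model.gate.weight.abs().sum() + model.experts.weight.abs().sum())

# Toy training loop with an adaptive-step-size optimizer (Adam here).
x, y = torch.randn(256, 10), torch.randn(256)
model = CadreRegressor(n_features=10)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y) + sparsity_penalty(model)
    loss.backward()
    opt.step()
```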

    Neural Basis Functions for Accelerating Solutions to High Mach Euler Equations

    We propose an approach to solving partial differential equations (PDEs) using a set of neural networks which we call Neural Basis Functions (NBF). This NBF framework is a novel variation of the POD DeepONet operator learning approach, in which we regress a set of neural networks onto a reduced-order Proper Orthogonal Decomposition (POD) basis. These networks are then used in combination with a branch network that ingests the parameters of the prescribed PDE to compute a reduced-order approximation to the PDE. This approach is applied to the steady-state Euler equations for high-speed flow conditions (Mach 10-30), where we consider the 2D flow around a cylinder which develops a shock condition. We then use the NBF predictions as initial conditions to a high-fidelity Computational Fluid Dynamics (CFD) solver (CFD++) to show faster convergence. Lessons learned for training and implementing this algorithm are presented as well. Comment: Published at ICML 2022 AI for Science workshop: https://openreview.net/forum?id=dvqjD3peY5
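
    A minimal sketch of the reduced-order idea, assuming a NumPy/PyTorch setup that the abstract does not specify: a POD basis is extracted from solution snapshots by SVD, and a small branch network maps a PDE parameter (here, Mach number) to coefficients in that basis. The snapshot matrix, network sizes, and the `predict_field` helper are all illustrative.

```python
# Sketch of the reduced-order idea behind NBF: build a POD basis from solution
# snapshots, then map PDE parameters to coefficients in that basis.
# Shapes and names are assumptions, not the paper's implementation.
import numpy as np
import torch
import torch.nn as nn

# Snapshot matrix: each column is a flattened flow-field solution.
snapshots = np.random.rand(5000, 40)                       # (n_grid_points, n_snapshots)
U, S, _ = np.linalg.svd(snapshots, full_matrices=False)
r = 8                                                      # reduced-order rank
pod_basis = torch.tensor(U[:, :r], dtype=torch.float32)    # (n_grid_points, r)

# Branch network: PDE parameters -> POD coefficients.
branch = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, r))

def predict_field(mach: float) -> torch.Tensor:
    """Reduced-order prediction of the flow field for a given Mach number."""
    coeffs = branch(torch.tensor([[mach]], dtype=torch.float32))  # (1, r)
    return coeffs @ pod_basis.T                                   # (1, n_grid_points)

# The predicted field could then seed a high-fidelity CFD solver as an initial
# condition, which is the acceleration strategy described in the abstract.
initial_guess = predict_field(15.0)
```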

    Curvature-informed multi-task learning for graph networks

    Properties of interest for crystals and molecules, such as band gap, elasticity, and solubility, are generally related to each other: they are governed by the same underlying laws of physics. However, when state-of-the-art graph neural networks attempt to predict multiple properties simultaneously (the multi-task learning (MTL) setting), they frequently underperform a suite of single property predictors. This suggests graph networks may not be fully leveraging these underlying similarities. Here we investigate a potential explanation for this phenomenon: the curvature of each property's loss surface significantly varies, leading to inefficient learning. This difference in curvature can be assessed by looking at spectral properties of the Hessians of each property's loss function, which is done in a matrix-free manner via randomized numerical linear algebra. We evaluate our hypothesis on two benchmark datasets (Materials Project (MP) and QM8) and consider how these findings can inform the training of novel multi-task learning models. Comment: Published at the ICML 2022 AI for Science workshop: https://openreview.net/forum?id=m5RYtApKFO
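
    The curvature probe can be illustrated with matrix-free Hessian-vector products: the sketch below estimates the dominant Hessian eigenvalue of a task loss by power iteration in PyTorch. This is a simplified stand-in for the randomized linear-algebra routines the abstract refers to; `top_hessian_eigenvalue` is an illustrative helper, not code from the paper.

```python
# Matrix-free curvature estimate: dominant Hessian eigenvalue of a loss via
# power iteration over Hessian-vector products (no explicit Hessian is formed).
import torch

def top_hessian_eigenvalue(loss, params, n_iters=20):
    grads = torch.autograd.grad(loss, params, create_graph=True)
    flat_grad = torch.cat([g.reshape(-1) for g in grads])
    v = torch.randn_like(flat_grad)
    v /= v.norm()
    eig = torch.tensor(0.0)
    for _ in range(n_iters):
        # Hessian-vector product via a second backward pass.
        hv = torch.autograd.grad(flat_grad @ v, params, retain_graph=True)
        hv = torch.cat([h.reshape(-1) for h in hv])
        eig = v @ hv                       # Rayleigh quotient estimate
        v = hv / (hv.norm() + 1e-12)
    return eig.item()

# Usage idea: compare curvature across property heads of a multi-task model.
# model = ...; losses = {"band_gap": loss_bg, "elasticity": loss_el}
# for name, loss in losses.items():
#     print(name, top_hessian_eigenvalue(loss, list(model.parameters())))
```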

    Evaluating the diversity and utility of materials proposed by generative models

    Generative machine learning models can use data generated by scientific modeling to create large quantities of novel material structures. Here, we assess how one state-of-the-art generative model, the physics-guided crystal generation model (PGCGM), can be used as part of the inverse design process. We show that the default PGCGM's input space is not smooth with respect to parameter variation, making material optimization difficult and limited. We also demonstrate that most generated structures are predicted to be thermodynamically unstable by a separate property-prediction model, partially due to out-of-domain data challenges. Our findings suggest how generative models might be improved to enable better inverse design. Comment: 12 pages, 9 figures. Published at SynS & ML @ ICML2023: https://openreview.net/forum?id=2ZYbmYTKo
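
    The smoothness assessment can be pictured as follows: perturb a generator input slightly and measure how much the decoded structure changes. In the sketch below, `generate` and `structure_distance` are hypothetical placeholders for the PGCGM decoder and a structure-matching metric; they are not part of any published API.

```python
# Smoothness probe for a generative model's input space: average structural
# change as a function of input-perturbation scale. Placeholders only.
import numpy as np

def smoothness_profile(generate, structure_distance, z,
                       scales=(1e-3, 1e-2, 1e-1), n_trials=20):
    """Mean structural change versus perturbation scale for a base input z."""
    base = generate(z)
    profile = {}
    for eps in scales:
        dists = []
        for _ in range(n_trials):
            z_perturbed = z + eps * np.random.randn(*z.shape)
            dists.append(structure_distance(base, generate(z_perturbed)))
        profile[eps] = float(np.mean(dists))
    return profile

# A smooth input space should show distances that grow gradually with eps;
# abrupt jumps at small eps are the kind of non-smoothness reported above.
```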

    The Octave (Birmingham - Sheffield Hallam) automated pipeline for extracting oscillation parameters of solar-like main-sequence stars

    The number of main-sequence stars for which we can observe solar-like oscillations is expected to increase considerably with the short-cadence, high-precision photometric observations from the NASA Kepler satellite. Because of this increase in the number of stars, automated tools are needed to analyse these data in a reasonable amount of time. In the framework of the asteroFLAG consortium, we present an automated pipeline which extracts frequencies and other parameters of solar-like oscillations in main-sequence and subgiant stars. The pipeline uses only the time series data as input and does not require any other input information. Tests on 353 artificial stars reveal that we can obtain accurate frequencies and oscillation parameters for about three quarters of the stars. We conclude that our methods are well suited for the analysis of main-sequence stars, which show mainly p-mode oscillations. Comment: accepted by MNRAS
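
    As an illustration of the kind of analysis such a pipeline automates, the sketch below estimates the large frequency separation of p modes from a light curve via the autocorrelation of its power spectrum, using NumPy and SciPy. It is not the Octave pipeline itself; the cadence value and the unrestricted frequency range are simplifying assumptions.

```python
# Toy estimate of the large frequency separation (Delta_nu) from a light curve:
# compute a power spectrum and look for the regular p-mode comb spacing in its
# autocorrelation. Illustrative only; not the Octave pipeline.
import numpy as np
from scipy.signal import periodogram

def large_frequency_separation(flux, cadence_s=58.85):
    """Estimate Delta_nu in the units of the frequency grid (Hz here)."""
    freqs, power = periodogram(flux - np.mean(flux), fs=1.0 / cadence_s)
    # Autocorrelation of the power spectrum: a regular p-mode comb produces
    # a peak at the lag corresponding to the large frequency separation.
    ac = np.correlate(power, power, mode="full")[len(power) - 1:]
    ac[0] = 0.0                     # ignore the zero-lag peak
    lag = np.argmax(ac)             # in practice one would restrict the search
    return lag * (freqs[1] - freqs[0])  # to the p-mode envelope around nu_max
```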